
# int4 Quantized Inference

## Vision 8B MiniCPM 2 5 Uncensored And Detailed 4bit
An int4 quantized version of MiniCPM-Llama3-V 2.5 that reduces GPU VRAM usage to approximately 9 GB.
Text-to-Image Transformers · sdasd112132 · 330 · 30
## Minicpm Llama3 V 2 5 Int4
The int4 quantized version of MiniCPM-Llama3-V 2.5 reduces GPU VRAM usage to approximately 9 GB, making it suitable for visual question answering tasks.
Text-to-Image Transformers · openbmb · 17.97k · 73
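
Both entries are int4 quantizations of MiniCPM-Llama3-V 2.5 that can be loaded through the Hugging Face `transformers` library. The sketch below shows one way to run a visual question answering query against the openbmb int4 checkpoint; the image path and the question are placeholders, and the sampling settings are illustrative rather than recommended values.

```python
# Minimal sketch: visual question answering with the int4 quantized
# MiniCPM-Llama3-V 2.5 checkpoint (openbmb/MiniCPM-Llama3-V-2_5-int4).
# The image path and question below are placeholders.
import torch
from PIL import Image
from transformers import AutoModel, AutoTokenizer

model_id = "openbmb/MiniCPM-Llama3-V-2_5-int4"

# trust_remote_code is needed because the model ships custom modeling code;
# the int4 weights keep VRAM usage at roughly 9 GB on a single GPU.
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder image
msgs = [{"role": "user", "content": "What is shown in this image?"}]

# chat() is a helper provided by the model's custom code.
with torch.no_grad():
    answer = model.chat(
        image=image,
        msgs=msgs,
        tokenizer=tokenizer,
        sampling=True,       # illustrative decoding settings
        temperature=0.7,
    )
print(answer)
```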